Label-Dependency Coding in Simple Recurrent Networks for Spoken Language Understanding
Abstract
Modeling target label dependencies is important for sequence labeling tasks. It can become crucial in Spoken Language Understanding (SLU) applications, especially for the slot-filling task, where models often have to deal with a large number of target labels. Conditional Random Fields (CRFs) were previously considered the most effective models under these conditions. More recently, several Recurrent Neural Network (RNN) architectures have been proposed for the SLU slot-filling task. Most of them, however, have only been evaluated on the simple ATIS database, from which it is difficult to draw significant conclusions. In this paper we propose new RNN variants that learn label dependencies efficiently and effectively by integrating label embeddings. We first show that modeling label dependencies is unnecessary on the (simple) ATIS database: unstructured models can produce state-of-the-art results on this benchmark, and on ATIS our new variants match state-of-the-art models while being much simpler. On the MEDIA benchmark, in contrast, we show that the proposed modification allows our RNNs to outperform both traditional RNNs and CRF models.
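Below is a minimal sketch of the idea described in the abstract, written in PyTorch. The class name, dimension sizes, and the greedy label feedback are illustrative assumptions, not the authors' exact architecture: at each step the tagger concatenates the current word embedding with the embedding of the previously predicted label, which is what lets the network learn dependencies between successive slot labels.

```python
import torch
import torch.nn as nn

class LabelDependencyRNN(nn.Module):
    """Simple recurrent tagger that feeds back the embedding of the
    previously predicted label (illustrative sketch, not the paper's code)."""

    def __init__(self, vocab_size, num_labels, word_dim=100, label_dim=30, hidden_dim=128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # Embeddings of the target labels themselves: this is what lets the
        # model encode label-label dependencies.
        self.label_emb = nn.Embedding(num_labels + 1, label_dim)  # +1 for a <start> label
        self.start_label = num_labels
        self.rnn_cell = nn.RNNCell(word_dim + label_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_labels)
        self.hidden_dim = hidden_dim

    def forward(self, words):
        # words: LongTensor of shape (seq_len,), word indices of one utterance
        h = torch.zeros(1, self.hidden_dim)
        prev_label = torch.tensor([self.start_label])
        logits = []
        for t in range(words.size(0)):
            # Concatenate the word embedding with the previous label's embedding.
            x = torch.cat([self.word_emb(words[t:t + 1]),
                           self.label_emb(prev_label)], dim=1)  # (1, word_dim + label_dim)
            h = self.rnn_cell(x, h)                             # (1, hidden_dim)
            step_logits = self.out(h)                           # (1, num_labels)
            logits.append(step_logits)
            prev_label = step_logits.argmax(dim=1)              # greedy feedback of predicted label
        return torch.cat(logits, dim=0)                         # (seq_len, num_labels)

# Usage sketch: tag a 7-word utterance with 20 hypothetical slot labels.
tagger = LabelDependencyRNN(vocab_size=1000, num_labels=20)
scores = tagger(torch.randint(0, 1000, (7,)))  # -> (7, 20) label scores
```

In practice, models of this kind are typically trained with teacher forcing (feeding the gold previous label rather than the predicted one), and decoded greedily as above or with beam search over label sequences.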
Similar resources
Recurrent Neural Network Structured Output Prediction for Spoken Language Understanding
Recurrent Neural Networks (RNNs) have been widely used for sequence modeling due to their strong capabilities in modeling temporal dependencies. In this work, we study and evaluate the effectiveness of using RNNs for slot filling, a key task in Spoken Language Understanding (SLU), with special focus on modeling label sequence dependencies. Recent work on slot filling using RNNs did not model la...
Recurrent Neural Networks with External Memory for Spoken Language Understanding
Recurrent Neural Networks (RNNs) have become increasingly popular for the task of language understanding. In this task, a semantic tagger is deployed to associate a semantic label with each word in an input sequence. The success of RNNs may be attributed to their ability to memorise long-term dependencies that relate the current-time semantic label prediction to observations many time instances a...
Label-Dependencies Aware Recurrent Neural Networks
In the last few years, Recurrent Neural Networks (RNNs) have proved effective on several NLP tasks. Despite such great success, their ability to model sequence labeling is still limited. This has led research toward solutions where RNNs are combined with models that have already proved effective in this domain, such as CRFs. In this work we propose a far simpler but very effective solution: an evoluti...
Recurrent Neural Learning for Classifying Spoken Utterances
For telecommunications companies, banks, and similar businesses, processing spontaneous language in helpdesk scenarios is important for automatic telephone interactions. However, the problem of understanding spontaneous spoken language is difficult. Learning techniques such as neural networks have the ability to learn in a robust manner. Recurrent networks have been used in neurocognitive or psycholinguisticall...
Is it time to switch to word embedding and recurrent neural networks for spoken language understanding?
Recently, word embedding representations have been investigated for slot filling in Spoken Language Understanding, along with the use of Neural Networks as classifiers. Neural Networks, especially Recurrent Neural Networks that are specifically adapted to sequence labeling problems, have been applied successfully on the popular ATIS database. In this work, we make a comparison of this kind of ...
Publication date: 2017